
    Metabonomics evaluations of age-related changes in the urinary compositions of male Sprague Dawley rats and effects of data normalization methods on statistical and quantitative analysis

    Background Urine from male Sprague-Dawley rats 25, 40, and 80 days old was analyzed by NMR and UPLC/MS. The effects of data normalization procedures on principal component analysis (PCA) and quantitative analysis of NMR-based metabonomics data were investigated. Additionally, the effects of age on the metabolic profiles were examined by both NMR and UPLC/MS analyses. Results The data normalization factor was shown to have a great impact on the statistical and quantitative results, indicating the need to carefully consider how best to normalize the data within a particular study and when comparing different studies. PCA applied to the data obtained from both NMR and UPLC/MS platforms revealed similar age-related differences. NMR indicated that many metabolites associated with the Krebs cycle decrease, while citrate and 2-oxoglutarate, also associated with the Krebs cycle, increase in older rats. Conclusion This study compared four different normalization methods for NMR-based metabonomics spectra from an age-related study. Each method of normalization was shown to have a great effect on both the statistical and quantitative analyses. Each normalization method altered the relative positions of significant PCA loadings for each sample spectrum but did not alter which chemical shifts had the highest loadings. The more strongly the normalization factor was related to age, the greater the separation between age groups in subsequent PCA analyses. The normalization factor that showed the least age dependence was total NMR intensity, which was consistent with the UPLC/MS data. Normalization by total intensity attempts to correct for the dietary and water intake of the individual animal, which is especially useful in metabonomics evaluations of urine. Additionally, metabonomics evaluations of age-related effects showed decreased concentrations of many Krebs cycle intermediates along with increased levels of oxidized antioxidants in the urine of older rats, consistent with current theories on aging and its association with diminishing mitochondrial function and increasing levels of reactive oxygen species. Analysis of urine by both NMR and UPLC/MS provides a comprehensive and complementary means of examining metabolic events in aging rats.
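
    For illustration, a minimal sketch of total-intensity normalization followed by PCA, the kind of workflow evaluated in this study; the matrix shape and simulated intensities are assumptions for the example, not data from the paper:

```python
import numpy as np
from sklearn.decomposition import PCA

def normalize_total_intensity(spectra):
    """Scale each binned spectrum (one row per sample) so that its bins
    sum to 1, correcting for differences in overall urine concentration
    driven by dietary and water intake."""
    return spectra / spectra.sum(axis=1, keepdims=True)

# Simulated stand-in for a samples x chemical-shift-bins intensity matrix.
rng = np.random.default_rng(0)
spectra = np.abs(rng.normal(loc=1.0, scale=0.2, size=(30, 200)))

normalized = normalize_total_intensity(spectra)

# PCA (mean-centering is applied internally by scikit-learn) to inspect
# group separation in the scores of the first two components.
scores = PCA(n_components=2).fit_transform(normalized)
print(scores[:3])
```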

    Quality assurance and quality control processes: summary of a metabolomics community questionnaire

    Introduction The Metabolomics Society Data Quality Task Group (DQTG) developed a questionnaire regarding quality assurance (QA) and quality control (QC) to provide baseline information about current QA and QC practices applied in the international metabolomics community. Objectives The DQTG has a long-term goal of promoting robust QA and QC in the metabolomics community through increased awareness via communication, outreach and education, and through the promotion of best working practices. An assessment of current QA and QC practices will serve as a foundation for future activities and development of appropriate guidelines. Methods QA was defined as the set of procedures performed in advance of the analysis of samples that are used to improve data quality. QC was defined as the set of activities that a laboratory performs during or immediately after analysis to demonstrate the quality of project data. A questionnaire was developed that included 70 questions covering demographic information, QA approaches and QC approaches, and allowed respondents to answer a subset or all of the questions. Results The DQTG questionnaire received 97 individual responses from 84 institutions across all fields of metabolomics, covering NMR, LC-MS, GC-MS, and other analytical technologies. Conclusion The wide range of responses concerning the use of QA and QC approaches indicated the limited availability of suitable training, a lack of Standard Operating Procedures (SOPs) for reviewing and making decisions on quality, and limited use of standard reference materials (SRMs) as QC materials. The DQTG QA/QC questionnaire has for the first time demonstrated that QA and QC usage is not uniform across metabolomics laboratories. Here we present recommendations on how to address these issues concerning QA and QC measurement and reporting in metabolomics.

    Modeling Chemical Interaction Profiles: II. Molecular Docking, Spectral Data-Activity Relationship, and Structure-Activity Relationship Models for Potent and Weak Inhibitors of Cytochrome P450 CYP3A4 Isozyme

    Polypharmacy has increasingly become a topic of public health concern, particularly as the U.S. population ages. Drug labels often contain insufficient information to enable clinicians to safely use multiple drugs. Because many drugs are biotransformed by cytochrome P450 (CYP) enzymes, inhibition of CYP activity has long been associated with potentially adverse health effects. In an attempt to reduce the uncertainty pertaining to CYP-mediated drug-drug/chemical interactions, an interagency collaborative group developed a consensus approach to prioritizing information concerning CYP inhibition. The consensus involved computational molecular docking, spectral data-activity relationship (SDAR), and structure-activity relationship (SAR) models that addressed the clinical potency of CYP inhibition. The models were built upon chemicals that were categorized as either potent or weak inhibitors of the CYP3A4 isozyme. The categorization was carried out using information from clinical trials because currently available in vitro high-throughput screening data were not fully representative of the in vivo potency of inhibition. During categorization it was found that compounds that break the Lipinski rule of five by molecular weight were about twice as likely to be inhibitors of CYP3A4 as those that obey the rule. Similarly, among inhibitors that break the rule, potent inhibitors were 2–3 times more frequent. The molecular docking classification relied on logistic regression, by which the docking scores from different docking algorithms, CYP3A4 three-dimensional structures, and binding sites on them were combined in a unified probabilistic model. The SDAR models employed a multiple linear regression approach applied to binned 1D 13C-NMR and 1D 15N-NMR spectral descriptors. Structure-based and physical-chemical descriptors were used as the basis for developing SAR models by the decision forest method. Thirty-three potent inhibitors and 88 weak inhibitors of CYP3A4 were used to train the models. Using these models, a synthetic majority-rules consensus classifier was implemented, with the confidence of each estimate assigned following the percent-agreement strategy. The classifier was applied to a test set of 120 inhibitors not included in the development of the models. Five compounds of the test set, including the known strong inhibitors dalfopristin and tioconazole, were classified as probable potent inhibitors of CYP3A4. Other known strong inhibitors, such as lopinavir, oltipraz, quercetin, raloxifene, and troglitazone, were among 18 compounds classified as plausible potent inhibitors of CYP3A4. The consensus estimation of inhibition potency is expected to aid in the nomination of pharmaceuticals, dietary supplements, environmental pollutants, and occupational and other chemicals for in-depth evaluation of CYP3A4 inhibitory activity. It may also serve as an estimate of chemical interactions via CYP3A4 metabolic pharmacokinetic pathways occurring through polypharmacy and nutritional and environmental exposures to chemical mixtures.
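
    For illustration, a minimal sketch of a majority-rules consensus classifier with percent-agreement confidence, in the spirit of the approach described above; the component models and labels are placeholders, not the authors' actual docking, SDAR, or SAR implementations:

```python
from collections import Counter

def consensus_predict(models, compound):
    """Majority-rules consensus: return the most common class label across
    component models plus the percent agreement as a confidence score."""
    votes = [model(compound) for model in models]
    label, count = Counter(votes).most_common(1)[0]
    return label, count / len(votes)

# Placeholder component models standing in for the docking, SDAR, and SAR
# classifiers; each maps a compound to "potent" or "weak".
docking = lambda c: "potent"
sdar = lambda c: "weak"
sar = lambda c: "potent"

label, agreement = consensus_predict([docking, sdar, sar], "compound_X")
print(label, round(agreement, 2))  # potent 0.67
```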

    Quality assurance and quality control reporting in untargeted metabolic phenotyping: mQACC recommendations for analytical quality management

    Background Demonstrating that the data produced in metabolic phenotyping investigations (metabolomics/metabonomics) is of good quality is increasingly seen as a key factor in gaining acceptance for the results of such studies. The use of established quality control (QC) protocols, including appropriate QC samples, is an important and evolving aspect of this process. However, inadequate or incorrect reporting of the QA/QC procedures followed in the study may lead to misinterpretation or overemphasis of the findings and prevent future meta-analysis of the body of work. Objective The aim of this guidance is to provide researchers with a framework that encourages them to describe quality assurance and quality control procedures and outcomes in mass spectrometry and nuclear magnetic resonance spectroscopy-based untargeted metabolomics, with a focus on reporting on QC samples in sufficient detail for them to be understood, trusted and replicated. There is no intent to be prescriptive with regard to analytical best practices; rather, guidance for reporting QA/QC procedures is suggested. A template that can be completed as studies progress, to ensure that relevant data are collected, and further supporting documents are provided as on-line resources. Key reporting practices Multiple topics should be considered when reporting QA/QC protocols and outcomes for metabolic phenotyping data. Coverage should include the role(s), sources, types, preparation and uses of the QC materials and samples generally employed in the generation of metabolomic data. Details such as sample matrices and sample preparation, the use of test mixtures and system suitability tests, blanks and technique-specific factors are considered, and methods for reporting are discussed, including the importance of reporting the acceptance criteria for the QCs. To this end, the reporting of QC samples and results is considered at two levels of detail: “minimal” and “best reporting practice” levels.
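
    As a concrete example of one commonly reported acceptance criterion, a hedged sketch of the relative standard deviation (RSD) check across pooled-QC injections; the 30% threshold is a convention often cited for untargeted LC-MS and is used here only as an illustrative default, not an mQACC-mandated value:

```python
import numpy as np

def qc_rsd_report(qc_intensities, threshold=30.0):
    """qc_intensities: pooled-QC injections x features. Returns per-feature
    RSD (%) and a boolean mask of features meeting the acceptance criterion."""
    mean = qc_intensities.mean(axis=0)
    rsd = 100.0 * qc_intensities.std(axis=0, ddof=1) / mean
    return rsd, rsd <= threshold

# Simulated pooled-QC injections (8 injections x 500 features).
rng = np.random.default_rng(1)
qc = np.abs(rng.normal(loc=100.0, scale=10.0, size=(8, 500)))

rsd, passed = qc_rsd_report(qc)
print(f"{passed.mean():.0%} of features meet the RSD <= 30% criterion")
```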

    Metabolomics enables precision medicine: “A White Paper, Community Perspective”

    Introduction: Background to metabolomics: Metabolomics is the comprehensive study of the metabolome, the repertoire of biochemicals (or small molecules) present in cells, tissues, and body fluids. The study of metabolism at the global or “-omics” level is a rapidly growing field that has the potential to have a profound impact upon medical practice. At the center of metabolomics is the concept that a person’s metabolic state provides a close representation of that individual’s overall health status. This metabolic state reflects what has been encoded by the genome and modified by diet, environmental factors, and the gut microbiome. The metabolic profile provides a quantifiable readout of biochemical state, from normal physiology to diverse pathophysiologies, in a manner that is often not obvious from gene expression analyses. Today, clinicians capture only a very small part of the information contained in the metabolome, as they routinely measure only a narrow set of blood chemistry analytes to assess health and disease states. Examples include measuring glucose to monitor diabetes, measuring cholesterol and the high-density lipoprotein/low-density lipoprotein ratio to assess cardiovascular health, BUN and creatinine for renal disorders, and measuring a panel of metabolites to diagnose potential inborn errors of metabolism in neonates. Objectives of White Paper – expected treatment outcomes and metabolomics as an enabling tool for precision medicine: We anticipate that the narrow range of chemical analyses in current use by the medical community today will be replaced in the future by analyses that reveal a far more comprehensive metabolic signature. This signature is expected to describe global biochemical aberrations that reflect patterns of variance in states of wellness, more accurately describe specific diseases and their progression, and greatly aid in differential diagnosis. Such future metabolic signatures will: (1) provide predictive, prognostic, diagnostic, and surrogate markers of diverse disease states; (2) inform on underlying molecular mechanisms of diseases; (3) allow for sub-classification of diseases and stratification of patients based on metabolic pathways impacted; (4) reveal biomarkers for drug response phenotypes, providing an effective means to predict variation in a subject’s response to treatment (pharmacometabolomics); (5) define a metabotype for each specific genotype, offering a functional read-out for genetic variants; (6) provide a means to monitor response and recurrence of diseases, such as cancers; (7) describe the molecular landscape in human performance applications and extreme environments. Importantly, sophisticated metabolomic analytical platforms and informatics tools have recently been developed that make it possible to measure thousands of metabolites in blood, other body fluids, and tissues. Such tools also enable more robust analysis of response to treatment. New insights have been gained about mechanisms of diseases, including neuropsychiatric disorders, cardiovascular disease, cancers, diabetes and a range of pathologies. 
A series of groundbreaking studies supported by the National Institutes of Health (NIH) through the Pharmacometabolomics Research Network and its partnership with the Pharmacogenomics Research Network illustrate how a patient’s metabotype at baseline, prior to treatment, during treatment, and post-treatment can inform about treatment outcomes and variation in responsiveness to drugs (e.g., statins, antidepressants, antihypertensives and antiplatelet therapies). These studies, along with several others, also exemplify how metabolomics data can complement and inform genetic data in defining the ethnic, sex, and gender basis for variation in responses to treatment, illustrating how pharmacometabolomics and pharmacogenomics are complementary and powerful tools for precision medicine. Conclusions: Key scientific concepts and recommendations for precision medicine: Our metabolomics community believes that inclusion of metabolomics data in precision medicine initiatives is timely and will provide an extremely valuable layer of data that complements and informs other data obtained by these important initiatives. Our Metabolomics Society, through its “Precision Medicine and Pharmacometabolomics Task Group”, with input from our metabolomics community at large, has developed this White Paper in which we discuss the value of, and approaches for, including metabolomics data in large precision medicine initiatives. This White Paper offers recommendations for the selection of state-of-the-art metabolomics platforms and approaches that offer the widest biochemical coverage, and considers critical sample collection and preservation as well as standardization of measurements, among other important topics. We anticipate that our metabolomics community will have representation in large precision medicine initiatives to provide input with regard to sample acquisition/preservation, selection of optimal omics technologies, and key issues regarding data collection, interpretation, and dissemination. We strongly recommend the collection and biobanking of samples for precision medicine initiatives that will take into consideration the needs of large-scale metabolic phenotyping studies.

    The time is now: Achieving FH paediatric screening across Europe – The Prague Declaration

    Familial Hypercholesterolaemia (FH) is severely under-recognized, under-diagnosed and under-treated in Europe, leading to a significantly higher risk of premature cardiovascular disease in those affected. FH is an inherited condition of very high cholesterol that affects 1 in 300 individuals regardless of age, race, sex, and lifestyle, making it the most common inherited metabolic disorder and a non-modifiable cardiovascular disease risk factor in the world.

    Single valproic acid treatment inhibits glycogen and RNA ribose turnover while disrupting glucose-derived cholesterol synthesis in liver as revealed by the [U-13C6]-d-glucose tracer in mice

    Previous genetic and proteomic studies identified altered activity of various enzymes, such as those of fatty acid metabolism and glycogen synthesis, after a single toxic dose of valproic acid (VPA) in rats. In this study, we demonstrate the effect of VPA on metabolite synthesis flux rates and the possible use of abnormal 13C-labeled glucose-derived metabolites in plasma or urine as early markers of toxicity. Female CD-1 mice were injected subcutaneously with saline or 600 mg/kg VPA. Twelve hours later, the mice were injected with an intraperitoneal load of 1 g/kg [U-13C6]-d-glucose. 13C isotopomers of glycogen glucose and RNA ribose in liver, kidney and brain tissue, as well as glucose disposal via cholesterol and glucose in the plasma and urine, were determined. The levels of all positional 13C isotopomers of glucose were similar in plasma, suggesting that a single VPA dose does not disturb glucose absorption, uptake or hepatic glucose metabolism. Three-hour urine samples showed an increase in the injected tracer, indicating decreased glucose re-absorption via the kidney tubules. Deposition of 13C-labeled glucose as liver glycogen or as RNA ribose was decreased by VPA treatment; incorporation of 13C via acetyl-CoA into plasma cholesterol was significantly lower at 60 min. The severe decreases in glucose-derived carbon flux into plasma and kidney-bound cholesterol, liver glycogen and RNA ribose synthesis, as well as the decreased glucose re-absorption and increased disposal via urine, all serve as early flux markers of VPA-induced adverse metabolic effects in the host.
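
    To make the isotopomer arithmetic concrete, a minimal sketch of computing fractional 13C enrichment from a mass isotopomer distribution (MID), the kind of quantity underlying the flux markers above; the MID values are invented for the example and do not come from the study:

```python
def fractional_enrichment(mid, n_carbons):
    """mid: fractional abundances [M+0, M+1, ..., M+n] summing to 1.
    Returns the average fraction of carbon positions carrying 13C."""
    return sum(i * frac for i, frac in enumerate(mid)) / n_carbons

# Invented MID for a six-carbon, glucose-derived metabolite.
mid = [0.55, 0.10, 0.05, 0.05, 0.05, 0.05, 0.15]  # M+0 .. M+6
print(f"fractional 13C enrichment: {fractional_enrichment(mid, 6):.3f}")
```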

    An untargeted multi-technique metabolomics approach to studying intracellular metabolites of HepG2 cells exposed to 2,3,7,8-tetrachlorodibenzo-p-dioxin

    Background In vitro cell systems together with omics methods represent promising alternatives to conventional animal models for toxicity testing. Transcriptomic and proteomic approaches have been widely applied in vitro, but relatively few studies have used metabolomics. Therefore, the goal of the present study was to develop an untargeted methodology for performing reproducible metabolomics on in vitro systems. The human liver cell line HepG2 and the well-known hepatotoxic and non-genotoxic carcinogen 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) were used as the in vitro model system and model toxicant, respectively. Results The study focused on the analysis of intracellular metabolites using NMR, LC-MS and GC-MS, with emphasis on the reproducibility and repeatability of the data. State-of-the-art pre-processing and alignment tools and multivariate statistics were used to detect significantly altered levels of metabolites after exposing HepG2 cells to TCDD. Several metabolites identified using databases, literature and LC-nanomate-Orbitrap analysis were affected by the treatment. The observed changes in metabolite levels are discussed in relation to the reported effects of TCDD. Conclusions Untargeted profiling of the polar and apolar metabolites of in vitro cultured HepG2 cells is a valid approach to studying the effects of TCDD on the cell metabolome. The approach described in this research demonstrates that highly reproducible experiments and correct normalization of the datasets are essential for obtaining reliable results. The effects of TCDD on HepG2 cells reported herein are in agreement with previous studies and serve to validate the procedures used in the present work.
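
    As a generic stand-in for the statistical step described (not the authors' exact pipeline), a sketch of flagging significantly altered metabolites with Welch's t-test and Benjamini-Hochberg FDR correction on simulated intensities:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
control = rng.normal(100, 10, size=(6, 300))  # replicates x metabolites
exposed = rng.normal(100, 10, size=(6, 300))
exposed[:, :20] += 30                         # simulate 20 altered features

# Welch's t-test per metabolite feature.
_, pvals = stats.ttest_ind(control, exposed, axis=0, equal_var=False)

def bh_fdr(p, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: boolean mask of rejections."""
    order = np.argsort(p)
    adjusted = p[order] * len(p) / (np.arange(len(p)) + 1)
    passed = adjusted <= alpha
    k = passed.nonzero()[0].max() + 1 if passed.any() else 0
    mask = np.zeros(len(p), dtype=bool)
    mask[order[:k]] = True
    return mask

print(f"{bh_fdr(pvals).sum()} features significant at 5% FDR")
```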

    Emerging technologies and their impact on regulatory science

    There is an evolving and increasing need for the utilization of emerging cellular, molecular and in silico technologies and novel approaches for the safety assessment of food, drugs, and personal care products. Convergence of these emerging technologies is also enabling rapid advances and approaches that may impact regulatory decisions and approvals. Although the development of emerging technologies may allow rapid advances in regulatory decision making, there is concern that these new technologies have not been thoroughly evaluated to determine whether they are ready for regulatory application, singly or in combination. The magnitude of these combined technical advances may outpace the ability to assess their fitness for purpose and to allow routine application of these new methods for regulatory purposes. There is a need to develop strategies for evaluating the new technologies to determine which ones are ready for regulatory use. The opportunity to apply these potentially faster, more accurate, and cost-effective approaches remains an important goal to facilitate their incorporation into regulatory use. However, without a clear strategy to evaluate emerging technologies rapidly and appropriately, the value of these efforts may go unrecognized or may take longer to realize. It is important for the regulatory science field to keep up with the research in these technically advanced areas and to understand the science behind these new approaches. The regulatory field must understand the critical quality attributes of these novel approaches and learn from each other's experience so that workforces can be trained to prepare for emerging global regulatory challenges. Moreover, it is essential that the regulatory community work with technology developers to harness collective capabilities towards developing a strategy for the evaluation of these new and novel assessment tools.